Google Analytics Categorization

Categorizing ADC site page URLs to more easily analyze user engagement.


Google Analytics Definitions

Page: The Page dimension shows the part of the URL after your domain name (the path) when someone views content on your website. For example, if someone views https://www.example.com/contact, then /contact is reported as the Page in the Behavior reports.
User: An individual visitor to the site (tracked using browser cookies)
Sessions: A single visit to the website, consisting of one or more pageviews and any other interactions (the default session timeout is 30 minutes)
User % of Total: Users displayed as a percentage of the total Users during the report period
Pageviews: The number of times users view a page that has the Google Analytics tracking code installed. This counts every view: if a user refreshes the page, or navigates away from the page and returns, each counts as an additional pageview.
Unique Pageviews: The number of sessions in which a page was viewed at least once. Whether a user viewed the page once or five times during a visit, it counts as one unique pageview.
Entrances: The number of visits that started on a specific page or group of pages, i.e. the first page someone views during a session
Bounce Rate: The percentage of visits in which users leave the site after just one pageview, regardless of how long they stayed on that page (total bounces divided by total sessions)
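The bounce-rate definition above can be checked with a quick sketch (toy numbers for illustration, not ADC data):

```r
# Toy session data: one row per session, with that session's pageview count
sessions <- data.frame(id = 1:5, pageviews = c(1, 3, 1, 2, 1))

# A bounce is a single-pageview session; bounce rate = total bounces / total sessions
bounces <- sum(sessions$pageviews == 1)
bounce_rate <- bounces / nrow(sessions)
bounce_rate  # 3 of 5 sessions bounced, so 0.6
```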



Categorization Function

We will use the code and function below to categorize the Google Analytics dataset. The function takes messy character data within a dataframe and categorizes it based on a set of search-string criteria. The inputs are the data frame, the column name of the messy data, a list of search strings, a list of category names (these two lists must be the same length and in corresponding order), and, optionally, a name for the new column.

It is important to note that the order of the search strings matters when one string contains another: "catalog" also matches "catalog/submit", so whichever pattern runs first claims those rows. Always list the longer string (i.e. catalog/submit) before the shorter one. Additionally, make sure the order of the categories list matches the order of the search strings.
Source: https://github.com/lenwood
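Why ordering matters can be seen in a condensed sketch of the same first-match-wins logic (toy page names; this is not the full function below, which removes matched rows instead):

```r
pages <- c("home", "catalogsubmit", "catalogdata", "aboutus", "randompage")
pat   <- c("catalogsubmit", "catalog", "about", "home")  # longer string listed first
cats  <- c("Submit", "Cathome", "About", "Home")

# Earlier patterns claim rows first; later patterns only apply to rows still "OTHER"
result <- rep("OTHER", length(pages))
for (i in seq_along(pat)) {
  hit <- grepl(pat[i], pages, ignore.case = TRUE) & result == "OTHER"
  result[hit] <- cats[i]
}
data.frame(Page = pages, Category = result)
```

If "catalog" were listed before "catalogsubmit", the submit page would be mislabeled Cathome.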



Identify Search Strings and Category Names

# List of search strings -- note that the longer search strings are identified first 
search <- c("news", "portals", "about","catalogprofile", "catalogsubmit", "catalog", "training", "team", "home", "view", "submit", "profile")

# List of categories
categories <- c("News", "Portals", "About", "Summary", "Submit", "Cathome", "Training", "Team", "Home", "Dataset", "WhoMustSub", "Summary")
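Because the function pairs search[i] with categories[i], it is worth failing fast if the two vectors drift out of sync. A small sanity check (repeating the two vectors from above):

```r
search <- c("news", "portals", "about", "catalogprofile", "catalogsubmit", "catalog",
            "training", "team", "home", "view", "submit", "profile")
categories <- c("News", "Portals", "About", "Summary", "Submit", "Cathome",
                "Training", "Team", "Home", "Dataset", "WhoMustSub", "Summary")

# The i-th search string maps to the i-th category, so the lengths must match
stopifnot(length(search) == length(categories))

# A named vector makes the pairing easy to eyeball
setNames(categories, search)
```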


Create Categorization Function

# Quickly categorize a data frame with a column of messy character strings. 

# Replace "df" with your messy dataframe.

categorizeDF <- function(df, searchColName, searchList, catList, newColName="Category") {
  # create empty data frame to hold categories
  catDF <- data.frame(matrix(ncol=ncol(df), nrow=0))
  colnames(catDF) <- names(df)

  # add sequence so original order can be restored
  df$sequence <- seq(nrow(df))

  # iterate through the strings
  for (i in seq_along(searchList)) {
    rownames(df) <- NULL
    index <- grep(searchList[i], df[[searchColName]], ignore.case=TRUE)
    # guard against empty matches: df[-integer(0), ] would drop every row
    if (length(index) > 0) {
      tempDF <- df[index, ]
      tempDF$newCol <- catList[i]
      catDF <- rbind(catDF, tempDF)
      df <- df[-index, ]
    }
  }

  # OTHER category for unmatched rows
  if (nrow(df) > 0) {
    df$newCol <- "OTHER"
    catDF <- rbind(catDF, df)
  }

  # return to the original order & remove the sequence data
  catDF <- catDF[order(catDF$sequence),]
  catDF$sequence <- NULL

  # remove row names
  rownames(catDF) <- NULL

  # set Category type to factor
  catDF$newCol <- as.factor(catDF$newCol)

  # rename the new column
  colnames(catDF)[which(colnames(catDF) == "newCol")] <- newColName
  catDF
}


Call Function and Categorize Data

# Replace "df" with messy dataframe

# Identify which column you want to categorize -- in our case with Google Analytics, we will be categorizing the "Page" column that contains messy URL strings. Additionally, you can name the new column that contains the categories (e.g. "Category").

sorted <- categorizeDF(df, "column name with messy data", search, categories, "new category column name")



Test Run of Categorization with Small Subset of Data

###### TEST DATASET ######


# Remove slashes and other punctuation (including hyphens and periods) from every column. **** Not sure if this is necessary. The goal is to differentiate the single "/" as the ADC Homepage and make the search terms for the function below easier to match. Note that stripping punctuation from all columns also removes decimal points from the numeric columns (visible in the table below).
test_users_clean <- top_30_users %>%
  mutate(across(everything(), ~ gsub("[[:punct:]]", "", .x)))


# Rename the home page as "home" in the dataframe. **NOTE that for this particular dataset the "Home" page is the top-viewed page, hence index [1]. If it is not the top-viewed page, you will need to determine which row holds the home page and use that row number in the brackets. *** Is there a better way to do this?? ***

test_users_clean$Page[1] <- "home"
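One less brittle alternative to hard-coding row [1]: after the punctuation strip, the bare "/" Page becomes an empty string, so the home-page row can be matched by value rather than by position (a sketch with toy values standing in for the cleaned Page column):

```r
Page <- c("", "catalog", "about")  # "" is what a lone "/" becomes after gsub("[[:punct:]]", "", .)
Page[Page == ""] <- "home"
Page
```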


### Categorize the page URLS in the Page column into larger categories using a function ###

## Create a list of search strings to sort through pages and a list of categories (the two lists must be in corresponding order). **Order matters when one search string contains another -- e.g. "catalog" also matches "catalog/submit" -- so the longer string (i.e. catalog/submit) must be listed first.

# List of search strings
search <- c("news", "portals", "about","catalogprofile", "catalogsubmit", "catalog", "training", "team", "home", "view", "submit", "profile")

# List of categories
categories <- c("News", "Portals", "About", "Summary", "Submit", "Cathome", "Training", "Team", "Home", "Dataset", "WhoMustSub", "Summary")



## Create the function (below) to categorize the messy "Page" column of the raw data frame. 
# This function looks at a data frame column of messy character (or factor) data and produces a new column of categorized data. The inputs are the data frame, the column name of the messy data, a list of search strings, a list of category names (these two must be in corresponding order), and, optionally, a name for the new column.


# Function (same as above, with the parameter named df for clarity):
categorizeDF <- function(df, searchColName, searchList, catList, newColName="Category") {
  # create empty data frame to hold categories
  catDF <- data.frame(matrix(ncol=ncol(df), nrow=0))
  colnames(catDF) <- names(df)

  # add sequence so original order can be restored
  df$sequence <- seq(nrow(df))

  # iterate through the strings
  for (i in seq_along(searchList)) {
    rownames(df) <- NULL
    index <- grep(searchList[i], df[[searchColName]], ignore.case=TRUE)
    # guard against empty matches: df[-integer(0), ] would drop every row
    if (length(index) > 0) {
      tempDF <- df[index, ]
      tempDF$newCol <- catList[i]
      catDF <- rbind(catDF, tempDF)
      df <- df[-index, ]
    }
  }

  # OTHER category for unmatched rows
  if (nrow(df) > 0) {
    df$newCol <- "OTHER"
    catDF <- rbind(catDF, df)
  }

  # return to the original order & remove the sequence data
  catDF <- catDF[order(catDF$sequence),]
  catDF$sequence <- NULL

  # remove row names
  rownames(catDF) <- NULL

  # set Category type to factor
  catDF$newCol <- as.factor(catDF$newCol)

  # rename the new column
  colnames(catDF)[which(colnames(catDF) == "newCol")] <- newColName
  catDF
}


# Call the function and create new data frame - using the raw data frame, the messy column you want to sort, the search and category lists, and name of the new column

sortedDF <- categorizeDF(test_users_clean, "Page", search, categories, "Category")


knitr::kable(sortedDF, format = "html")
Page Users Sessions Users_._of_Total Pageviews Unique_Pageviews Entrances Bounce_Rate Category
home 25436 42464 0440145 75951 55359 42443 0421957423 Home
catalog 4310 2280 0257363 12070 8734 2131 0247368421 Cathome
catalog 4130 416 0195397 33 26 19 0033653846 Cathome
data 3291 2306 0160785 19380 9507 2319 0273200347 OTHER
catalogdata 3114 3395 0139405 14923 7964 3298 0253608247 Cathome
about 2637 941 0123776 3942 3297 944 0582359192 About
team 1634 614 0110133 2554 2117 615 0684039088 Team
submit 1384 898 009936 3255 2395 901 0643652561 WhoMustSub
page0 1174 1580 0090577 5362 3813 1582 0158860759 OTHER
training 1166 892 0083537 2245 1639 892 0515695067 Training
publications 1120 431 0077705 1701 1326 432 0744779582 OTHER
share 1060 528 0072758 4532 2706 529 0357954545 OTHER
profile 989 232 0068477 1609 1338 238 061637931 Summary
qanda 912 214 0064713 1430 1181 214 0570093458 OTHER
january2019datasciencetrainingforarcticresearchers 903 1004 0061441 1359 1191 1004 0815737052 Training
datapage0 873 556 0058545 4704 2254 557 0303956835 OTHER
catalogprofile 799 193 0055914 1376 1121 188 0564766839 Summary
proposals 773 660 0053551 1187 1008 661 0762121212 OTHER
homehtm 735 800 0051402 936 817 800 056875 Home
support 729 121 0049463 1308 1004 122 058677686 OTHER
dataplans 685 384 0047672 932 827 384 0841145833 OTHER
2018datasciencetrainingforarcticresearchers 649 639 0046015 982 876 639 0723004695 Training
news201606datascienceopportunities 629 371 0044488 810 733 371 0851752022 News
upcomingdatasciencetrainingforarcticresearchers 599 612 0043066 857 767 612 0823529412 Training
catalogsubmit 582 302 0041746 1657 1196 304 0523178808 Submit
catalogportalspermafrost 548 651 0040505 874 722 650 0769585253 Portals
reconcilinghistoricalandcontemporarytrendsinterrestrialcarbonexchangeofthenorthernpermafrostzone 546 844 0039355 1189 995 844 0808056872 OTHER
viewdoi103334CDIAC00001V2017 522 562 0038272 769 619 562 0807829181 Dataset
catalogshare 521 399 0037263 1197 994 378 0483709273 Cathome
categorynews 512 197 0036317 822 672 197 0624365482 News





Visualizations for User Analysis

##  2016  2017  2018  2019  2020 
## 23772 32205 48337 31187 60330

[Treemap figures by year rendered here. The chunk also printed the full treemap() return object -- per-category user counts for 2016-2020 together with rectangle layout coordinates (x0, y0, w, h) and fill colors -- which has been omitted. The treemaps size rectangles by Users ($vSize = "Users") using the "pivotSize" algorithm.]